The extended Bregman divergence and parametric estimation

Authors

Abstract

Minimization of suitable statistical distances (between the data and model densities) is a useful technique in the field of robust inference. Apart from the class of ϕ-divergences, the Bregman divergence is extensively used for this purpose. However, since the term involving both densities in a Bregman divergence must have a linear structure, several useful divergences cannot be captured by the usual Bregman form. We provide an extension of the ordinary Bregman divergence by considering an exponent of the density function as the argument, rather than the density itself. Many divergences that are not ordinarily Bregman divergences can be accommodated within this extended description. Using this formulation, one can develop many new families of divergences which may be useful in robust inference. In particular, through an application of this extension, we propose the GSB divergence family. We explore the applicability of the minimum GSB divergence estimator in discrete parametric models. Simulation studies and real data examples are provided to demonstrate the performance of the estimator and to substantiate the theory developed.
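The construction described in the abstract can be sketched in code. A Bregman divergence is built from a convex function φ; the extension applies φ to an exponent of the density rather than to the density itself. The function name `extended_bregman`, the parameter `alpha`, and the choice φ(x) = x² below are illustrative assumptions for a discrete support — the exact form of the GSB family is defined in the paper and is not reproduced here.

```python
import numpy as np

def extended_bregman(g, f, phi, dphi, alpha=1.0):
    """Sketch of an extended Bregman divergence between discrete densities.

    The convex function phi (with derivative dphi) is applied to the
    alpha-exponentiated densities g**alpha and f**alpha, following the
    idea of using an exponent of the density as the argument.
    alpha = 1 recovers the ordinary Bregman form.
    """
    ga, fa = g ** alpha, f ** alpha
    # Bregman term: phi(ga) - phi(fa) - (ga - fa) * phi'(fa), summed
    # over the support; nonnegative whenever phi is convex.
    terms = phi(ga) - phi(fa) - (ga - fa) * dphi(fa)
    return float(np.sum(terms))

# With phi(x) = x**2 and alpha = 1, the expression reduces to the
# squared L2 distance between the two probability vectors.
g = np.array([0.20, 0.50, 0.30])
f = np.array([0.25, 0.45, 0.30])
d = extended_bregman(g, f, phi=lambda x: x ** 2, dphi=lambda x: 2 * x)
```

For φ(x) = x² the divergence equals Σ (g − f)², which is zero exactly when the two densities coincide; other convex choices of φ (and values of α) generate the richer families the paper studies.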



Similar articles

Penalized Bregman Divergence Estimation via Coordinate Descent

Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. Recently, the coordinate descent (CD) algorithm was developed by Friedman et al. (2007) for penalized linear regression and penalized logistic regression and was shown to gain computational superiority. This paper explores...


The Lovász-Bregman Divergence and connections to rank aggregation, clustering, and web ranking: Extended Version

We extend the recently introduced theory of Lovász Bregman (LB) divergences [20] in several ways. We show that they represent a distortion between a “score” and an “ordering”, thus providing a new view of rank aggregation and order based clustering with interesting connections to web ranking. We show how the LB divergences have a number of properties akin to many permutation based metrics, and ...


Bregman and Burbea-Rao Divergence for Matrices

In this paper, the Bregman and Burbea-Rao divergences for matrices are investigated. Two mean-value theorems for the divergences induced by C-functions are derived. As an application, certain Cauchy-type means of the entries of the matrices are constructed. By utilizing three classes of parametrized convex functions, the exponential convexity of the divergences, thought of as a function of the parame...


Proper Scoring Rules and Bregman Divergence

Proper scoring rules are a means of evaluating the quality of probabilistic forecasts. They induce dissimilarity measures of probability distributions known as Bregman divergences. We survey the literature on both entities and present their mathematical properties in a unified theoretical framework. This perspective allows us to identify score and Bregman divergences and characterize them toget...


Convex Relaxations of Bregman Divergence Clustering

Although many convex relaxations of clustering have been proposed in the past decade, current formulations remain restricted to spherical Gaussian or discriminative models and are susceptible to imbalanced clusters. To address these shortcomings, we propose a new class of convex relaxations that can be flexibly applied to more general forms of Bregman divergence clustering. By basing these new ...



Journal

Journal title: Statistics

Year: 2022

ISSN: 1029-4910, 0233-1888, 1026-7786

DOI: https://doi.org/10.1080/02331888.2022.2070622